Learning Log-Determinant Divergences for Positive Definite Matrices
Similar Resources
Learning Discriminative αβ-Divergences for Positive Definite Matrices
Symmetric positive definite (SPD) matrices are useful for capturing second-order statistics of visual data. To compare two SPD matrices, several measures are available, such as the affine-invariant Riemannian metric, Jeffreys divergence, Jensen-Bregman logdet divergence, etc.; however, their behaviors may be application dependent, raising the need for manual selection to achieve the best possibl...
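For concreteness, two of the measures named in this snippet can be sketched in a few lines of NumPy/SciPy. This is a minimal illustration with our own function names, not the paper's implementation:

```python
import numpy as np
from scipy.linalg import eigh


def jensen_bregman_logdet(X, Y):
    """Jensen-Bregman LogDet (Stein) divergence:
    logdet((X + Y)/2) - (1/2) * (logdet X + logdet Y)."""
    _, ld_mid = np.linalg.slogdet((X + Y) / 2.0)
    _, ld_x = np.linalg.slogdet(X)
    _, ld_y = np.linalg.slogdet(Y)
    return ld_mid - 0.5 * (ld_x + ld_y)


def affine_invariant_distance(X, Y):
    """Affine-invariant Riemannian metric ||log(X^{-1/2} Y X^{-1/2})||_F,
    computed via the generalized eigenvalues of the pencil (Y, X)."""
    lam = eigh(Y, X, eigvals_only=True)
    return np.sqrt(np.sum(np.log(lam) ** 2))
```

Both quantities vanish when the two SPD matrices coincide, which is one quick sanity check when comparing measures on real data.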
A New Determinant Inequality of Positive Semi-Definite Matrices
A new determinant inequality of positive semidefinite matrices is discovered and proved. This new inequality is useful for attacking and solving a variety of optimization problems arising from the design of wireless communication systems. I. A NEW DETERMINANT INEQUALITY The following notations are used throughout this article. The notations [·]^T and [·]^H stand for transpose and Hermitian tr...
Infinite-dimensional Log-Determinant divergences II: Alpha-Beta divergences
This work presents a parametrized family of divergences, namely Alpha-Beta LogDeterminant (Log-Det) divergences, between positive definite unitized trace class operators on a Hilbert space. This is a generalization of the Alpha-Beta Log-Determinant divergences between symmetric, positive definite matrices to the infinite-dimensional setting. The family of Alpha-Beta Log-Det divergences is highl...
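As a point of reference, the Alpha-Beta Log-Det divergence in the finite-dimensional (SPD matrix) case is commonly written as below. This is reproduced from memory of the matrix-case literature; parameter conventions in the infinite-dimensional paper may differ:

```latex
D^{(\alpha,\beta)}(P \,\|\, Q)
  = \frac{1}{\alpha\beta}
    \log\det\!\left(
      \frac{\alpha \,(P Q^{-1})^{\beta} + \beta \,(P Q^{-1})^{-\alpha}}{\alpha + \beta}
    \right),
  \qquad \alpha \neq 0,\ \beta \neq 0,\ \alpha + \beta \neq 0.
```

Particular choices of $(\alpha, \beta)$ recover familiar special cases, e.g. the Stein (Jensen-Bregman LogDet) divergence up to scaling.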
Supervised LogEuclidean Metric Learning for Symmetric Positive Definite Matrices
Metric learning has been shown to be highly effective to improve the performance of nearest neighbor classification. In this paper, we address the problem of metric learning for symmetric positive definite (SPD) matrices such as covariance matrices, which arise in many real-world applications. Naively using standard Mahalanobis metric learning methods under the Euclidean geometry for SPD matric...
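A minimal sketch of the log-Euclidean construction this line of work builds on. This is our own illustration, not the paper's algorithm; `W` is a placeholder for whatever positive semidefinite matrix a supervised method would learn:

```python
import numpy as np


def spd_log(X):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(X)
    return (V * np.log(w)) @ V.T


def log_euclidean_distance(X, Y):
    """Plain log-Euclidean distance: ||log X - log Y||_F."""
    return np.linalg.norm(spd_log(X) - spd_log(Y), 'fro')


def mahalanobis_log_euclidean(X, Y, W):
    """Mahalanobis-style distance between vectorized matrix logs,
    parameterized by a (hypothetical) learned PSD matrix W."""
    d = (spd_log(X) - spd_log(Y)).ravel()
    return np.sqrt(d @ W @ d)
```

With `W` set to the identity, the learned metric reduces to the plain log-Euclidean distance; metric learning amounts to choosing `W` so that same-class pairs are close and different-class pairs are far in the log domain.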
Riemannian Metric Learning for Symmetric Positive Definite Matrices
Over the past few years, symmetric positive definite (SPD) matrices have been receiving considerable attention from the computer vision community. Though various distance measures have been proposed in the past for comparing SPD matrices, the two most widely used measures are the affine-invariant distance and the log-Euclidean distance. This is because these two measures are true geodesic distances induced...
Journal
Journal Title: IEEE Transactions on Pattern Analysis and Machine Intelligence
Year: 2021
ISSN: 0162-8828,2160-9292,1939-3539
DOI: 10.1109/tpami.2021.3073588